A Sequential Quadratic Programming Algorithm for Nonconvex, Nonsmooth Constrained Optimization

Authors

  • Frank E. Curtis
  • Michael L. Overton
Abstract

We consider optimization problems with objective and constraint functions that may be nonconvex and nonsmooth. Problems of this type arise in important applications, many having solutions at points of nondifferentiability of the problem functions. We present a line search algorithm for situations when the objective and constraint functions are locally Lipschitz and continuously differentiable on open dense subsets of Rⁿ. Our method is based on a sequential quadratic programming (SQP) algorithm that uses an ℓ1 penalty to regularize the constraints. A process of gradient sampling (GS) is employed to make the search direction computation effective in nonsmooth regions. We prove that our SQP-GS method is globally convergent to stationary points with probability one and illustrate its performance with a MATLAB implementation.
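
To make the two ingredients named in the abstract concrete, here is a minimal Python sketch, not the authors' SQP-GS implementation: it builds an ℓ1 penalty for constraints written as c(x) ≤ 0 and computes a gradient-sampling direction as the negated minimum-norm element of the convex hull of sampled penalty gradients. The names (`l1_penalty`, `gs_direction`), the sampling radius `eps`, the sample count `m`, and the use of SciPy's SLSQP solver for the small dual QP are all illustrative assumptions.

```python
# A minimal, illustrative sketch of the two ingredients named in the abstract:
# an l1 penalty for the constraints and a gradient-sampling (GS) direction for
# the resulting nonsmooth function. This is NOT the authors' SQP-GS code; the
# constraint convention c(x) <= 0 and all names and parameters are assumptions.
import numpy as np
from scipy.optimize import minimize


def l1_penalty(f, grad_f, c, jac_c, rho):
    """Return phi(x) = f(x) + rho * sum_i max(c_i(x), 0) and a gradient formula
    valid wherever f and c are differentiable and no constraint is exactly zero
    (which is the case almost everywhere for the problems considered)."""
    def phi(x):
        return f(x) + rho * np.sum(np.maximum(c(x), 0.0))

    def grad_phi(x):
        active = (c(x) > 0.0).astype(float)     # 0/1 weights on violated constraints
        return grad_f(x) + rho * (jac_c(x).T @ active)

    return phi, grad_phi


def gs_direction(x, grad_phi, eps=1e-4, m=None, rng=None):
    """Gradient-sampling direction: the negated minimum-norm element of the
    convex hull of penalty gradients sampled in a ball of radius eps about x."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = x.size
    m = 2 * n if m is None else m
    points = [x] + [x + eps * rng.standard_normal(n) for _ in range(m)]
    G = np.array([grad_phi(z) for z in points])  # one sampled gradient per row
    k = len(points)
    # Small convex QP: minimize 0.5*||G^T lam||^2 over {lam >= 0, sum(lam) = 1}.
    res = minimize(
        lambda lam: 0.5 * np.dot(G.T @ lam, G.T @ lam),
        np.full(k, 1.0 / k),
        jac=lambda lam: G @ (G.T @ lam),
        bounds=[(0.0, None)] * k,
        constraints=[{"type": "eq", "fun": lambda lam: np.sum(lam) - 1.0}],
        method="SLSQP",
    )
    return -(G.T @ res.x)   # approximately zero near a stationary point
```

In the full method described by the paper, the sampled gradients feed an SQP-type subproblem and a line search; the sketch above only shows one simplified direction computation for the nonsmooth penalty function.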

Similar Articles

A feasible second order bundle algorithm for nonsmooth, nonconvex optimization problems with inequality constraints: I. Derivation and convergence

This paper extends the SQP-approach of the well-known bundle-Newton method for nonsmooth unconstrained minimization to the nonlinearly constrained case. Instead of using a penalty function or a filter or an improvement function to deal with the presence of constraints, the search direction is determined by solving a convex quadratically constrained quadratic program to obtain good iteration poi...

A BFGS-SQP method for nonsmooth, nonconvex, constrained optimization and its evaluation using relative minimization profiles

We propose an algorithm for solving nonsmooth, nonconvex, constrained optimization problems as well as a new set of visualization tools for comparing the performance of optimization algorithms. Our algorithm is a sequential quadratic optimization method that employs BFGS quasi-Newton Hessian approximations and an exact penalty function whose parameter is controlled using a steering strategy. We...
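
As a rough illustration of the "steering" idea mentioned in this summary (the rule, names, and constants below are assumptions rather than the cited method, and they use the common convention in which the penalty parameter weights the constraint violation), a steering update increases the penalty weight whenever a step is predicted to remove too small a fraction of the current infeasibility:

```python
# Hedged sketch of a generic steering-style penalty update; `fraction` and
# `factor` are illustrative constants, not values from the cited paper.
def steer_penalty(rho, viol_current, viol_predicted, fraction=0.1, factor=10.0):
    """Increase the penalty parameter rho when the computed step is predicted
    to remove less than `fraction` of the current constraint violation."""
    if viol_current - viol_predicted < fraction * viol_current:
        return rho * factor
    return rho
```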

Benson's algorithm for nonconvex multiobjective problems via nonsmooth Wolfe duality

In this paper, we propose an algorithm to obtain an approximation set of the (weakly) nondominated points of nonsmooth multiobjective optimization problems with equality and inequality constraints. We use an extension of the Wolfe duality to construct the separating hyperplane in Benson's outer algorithm for multiobjective programming problems with subdifferentiable functions. We also fo...

Robust and efficient methods for approximation and optimization of stability measures

We consider two new algorithms with practical application to the problem of designing controllers for linear dynamical systems with input and output: a new spectral value set based algorithm called hybrid expansion-contraction intended for approximating the H∞ norm, or equivalently, the complex stability radius, of large-scale systems, and a new BFGS SQP based optimization method for nonsmooth,...

A Second Derivative SQP Method with Imposed

Sequential quadratic programming (SQP) methods form a class of highly efficient algorithms for solving nonlinearly constrained optimization problems. Although second derivative information may often be calculated, there is little practical theory that justifies exact-Hessian SQP methods. In particular, the resulting quadratic programming (QP) subproblems are often nonconvex, and thus finding th...

Journal:
  • SIAM Journal on Optimization

Volume: 22, Issue: -

Pages: -

Publication date: 2012